Optimal Spherical Separability: Towards Optimal Kernel Design

Authors

  • Garimella Rama Murthy
  • Yaparla Ganesh
  • Rhishi Pratap Singh
Abstract

In this research paper, the concept of hyperspherical/hyper-ellipsoidal separability is introduced. A method of arriving at the optimal hypersphere (maximizing the margin) that separates two classes is discussed. By projecting the quantized patterns into a higher-dimensional space (as in the encoders of error-correcting codes), the patterns are made hyper-spherically separable. Single/multiple layers of spherical/ellipsoidal neurons are proposed for multi-class classification. An associative memory based on hyper-ellipsoidal neurons is proposed. The problem of optimal kernel design is discussed.
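
As a hedged illustration of the two-class case, the sketch below fixes the sphere's centre at the centroid of the enclosed class and places the radius midway between the farthest inner pattern and the nearest outer pattern, so the margin is maximized for that centre; the centroid choice and the helper name fit_separating_hypersphere are illustrative assumptions, not the paper's exact construction.

```python
# Minimal sketch of a margin-maximizing separating hypersphere for two
# classes, assuming the inner class is enclosed and the centre is fixed
# at its centroid (an assumption; the paper's optimisation may differ).
import numpy as np

def fit_separating_hypersphere(X_inner, X_outer):
    """Return (centre, radius, margin) for a hypersphere enclosing X_inner
    and excluding X_outer, with the radius placed midway between the two
    classes so the margin is maximised for this fixed centre."""
    centre = X_inner.mean(axis=0)                      # assumed centre
    d_inner = np.linalg.norm(X_inner - centre, axis=1)
    d_outer = np.linalg.norm(X_outer - centre, axis=1)
    r_in, r_out = d_inner.max(), d_outer.min()
    if r_out <= r_in:
        raise ValueError("classes are not spherically separable about this centre")
    radius = 0.5 * (r_in + r_out)                      # midpoint radius
    margin = 0.5 * (r_out - r_in)                      # distance to the closest pattern of either class
    return centre, radius, margin

# Toy usage: an inner Gaussian cluster surrounded by an outer ring.
rng = np.random.default_rng(0)
inner = rng.normal(0.0, 0.5, size=(50, 2))
angles = rng.uniform(0, 2 * np.pi, 50)
outer = np.c_[3 * np.cos(angles), 3 * np.sin(angles)] + rng.normal(0, 0.1, (50, 2))
c, r, m = fit_separating_hypersphere(inner, outer)
print("radius %.2f, margin %.2f" % (r, m))
```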


Similar Articles

Support Vector Machine Classification using Mahalanobis Distance Function

Support Vector Machine (SVM) is a powerful technique for data classification. The SVM constructs an optimal separating hyper-plane as a decision surface to divide the data points of different categories in the vector space. Kernel functions are used to extend the concept of the optimal separating hyper-plane to non-linearly separable cases, so that the data become linearly separable in the feature space. T...
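
A minimal sketch of the idea, assuming an RBF-style kernel built from the Mahalanobis distance (inverse covariance estimated from the training set, gamma chosen arbitrarily) and passed to scikit-learn's SVC as a precomputed kernel:

```python
# Hedged sketch of an SVM whose kernel uses the Mahalanobis distance in
# place of the Euclidean distance; the covariance pooling and gamma value
# are assumptions made for illustration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def mahalanobis_kernel(A, B, VI, gamma=0.5):
    """Kernel matrix K[i, j] = exp(-gamma * (a_i - b_j)^T VI (a_i - b_j))."""
    diff = A[:, None, :] - B[None, :, :]
    d2 = np.einsum('ijk,kl,ijl->ij', diff, VI, diff)
    return np.exp(-gamma * d2)

X, y = make_classification(n_samples=300, n_features=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
VI = np.linalg.pinv(np.cov(X_tr, rowvar=False))   # inverse covariance of the training data

clf = SVC(kernel='precomputed')
clf.fit(mahalanobis_kernel(X_tr, X_tr, VI), y_tr)
print("test accuracy:", clf.score(mahalanobis_kernel(X_te, X_tr, VI), y_te))
```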


Learning Kernel Parameters by using Class Separability Measure

Learning kernel parameters is important for kernel-based methods because these parameters have a significant impact on the generalization ability of these methods. Besides Cross-Validation and Leave-One-Out, minimizing upper bounds on the generalization error, such as the radius-margin bound, has also been proposed to learn the optimal kernel parameters more efficiently. In this ...
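
A hedged sketch of parameter selection by a class separability measure: each candidate RBF width is scored by the ratio of mean between-class to mean within-class squared distance in feature space, computed directly from the kernel matrix; the candidate grid and this particular criterion are assumptions, not the radius-margin bound mentioned above.

```python
# Select an RBF kernel width by maximising a simple class-separability
# ratio in feature space; a sketch, assuming an arbitrary gamma grid.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.metrics.pairwise import rbf_kernel

def separability(K, y):
    """Mean between-class over mean within-class squared feature-space
    distance, using ||phi(x) - phi(z)||^2 = K_xx + K_zz - 2 K_xz."""
    n = len(y)
    d2 = np.diag(K)[:, None] + np.diag(K)[None, :] - 2 * K
    same = (y[:, None] == y[None, :]) & ~np.eye(n, dtype=bool)
    diff = y[:, None] != y[None, :]
    return d2[diff].mean() / d2[same].mean()

X, y = make_moons(n_samples=200, noise=0.1, random_state=0)
gammas = np.logspace(-3, 3, 13)                     # assumed candidate grid
scores = [separability(rbf_kernel(X, gamma=g), y) for g in gammas]
print("selected gamma:", gammas[int(np.argmax(scores))])
```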


Optimal Spherical Separability: Artificial Neural Networks

In this research paper, the concept of hyper-spherical/hyper-ellipsoidal separability is introduced. A method of arriving at the optimal hypersphere (maximizing the margin) that separates two classes is discussed. By projecting the quantized patterns into a higher-dimensional space (as in the encoders of error-correcting codes), the patterns are made hyper-spherically separable. Single/multiple layers of spheric...
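
A hedged sketch of a single layer of spherical neurons for multi-class classification, with one neuron per class whose centre and radius are estimated from that class's training patterns; the class name SphericalLayer and the nearest-relative-distance assignment rule are illustrative assumptions rather than the paper's construction.

```python
# One spherical neuron per class: each neuron stores a centre and radius,
# and a pattern is assigned to the class whose sphere it falls in
# (smallest distance relative to radius). A sketch under the stated assumptions.
import numpy as np

class SphericalLayer:
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centres_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        # radius = distance of the farthest member of the class from its centre
        self.radii_ = np.array([
            np.linalg.norm(X[y == c] - m, axis=1).max()
            for c, m in zip(self.classes_, self.centres_)
        ])
        return self

    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centres_[None, :, :], axis=2)
        return self.classes_[np.argmin(d / self.radii_, axis=1)]

# Toy usage with three Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.3, (40, 2)) for m in ([0, 0], [3, 0], [0, 3])])
y = np.repeat([0, 1, 2], 40)
print("training accuracy:", (SphericalLayer().fit(X, y).predict(X) == y).mean())
```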


Optimizing kernel parameters by second-order methods

Radial basis function (RBF) kernels are widely used for support vector machines (SVMs), but model selection for an SVM requires optimizing the kernel parameter and the margin parameter by time-consuming cross-validation. In this paper we propose determining the parameters of RBF and Mahalanobis kernels by maximizing class separability using second-order optimization. For multi-cla...
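
A hedged stand-in for the second-order procedure: a quasi-Newton (BFGS) optimizer maximizes a simple class-separability score over log(gamma); both the objective and the choice of optimizer are assumptions, not the paper's exact formulation.

```python
# Tune the RBF kernel width by quasi-Newton (BFGS) maximisation of a
# between/within class-separability ratio; a sketch, not the paper's method.
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel

def neg_separability(log_gamma, X, y):
    K = rbf_kernel(X, gamma=float(np.exp(log_gamma[0])))
    d2 = np.diag(K)[:, None] + np.diag(K)[None, :] - 2 * K
    between = y[:, None] != y[None, :]
    within = ~between & ~np.eye(len(y), dtype=bool)
    return -(d2[between].mean() / d2[within].mean())   # negate so we can minimise

X, y = make_classification(n_samples=200, n_features=4, random_state=0)
res = minimize(neg_separability, x0=np.array([0.0]), args=(X, y), method='BFGS')
print("tuned gamma:", float(np.exp(res.x[0])))
```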


Determine the Kernel Parameter of KFDA Using a Minimum Search Algorithm

In this paper, we develop a novel approach to kernel parameter selection for kernel Fisher discriminant analysis (KFDA), based on the viewpoint that the optimal kernel parameter is associated with the maximum linear separability of the samples in the feature space. This makes our approach to selecting the kernel parameter of KFDA fully consistent with the essence of KFDA. Indeed, this paper is the...
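
A hedged stand-in for this idea: a bounded one-dimensional minimum search over log(gamma) of the negative Fisher ratio of an approximate kernel discriminant projection, using Nystroem features plus LDA in place of an exact KFDA implementation.

```python
# Kernel-parameter selection by a 1-D minimum search over log(gamma);
# Nystroem + LDA approximates KFDA here, which is an assumption.
import numpy as np
from scipy.optimize import minimize_scalar
from sklearn.datasets import make_moons
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.kernel_approximation import Nystroem

def neg_fisher_ratio(log_gamma, X, y):
    Z = Nystroem(gamma=np.exp(log_gamma), n_components=50,
                 random_state=0).fit_transform(X)
    z = LinearDiscriminantAnalysis(n_components=1).fit(Z, y).transform(Z).ravel()
    z0, z1 = z[y == 0], z[y == 1]
    # Fisher ratio of the 1-D projection, negated for minimisation.
    return -((z0.mean() - z1.mean()) ** 2) / (z0.var() + z1.var() + 1e-12)

X, y = make_moons(n_samples=200, noise=0.15, random_state=0)
res = minimize_scalar(neg_fisher_ratio, bounds=(-4, 4), method='bounded', args=(X, y))
print("selected gamma:", float(np.exp(res.x)))
```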




Publication date: 2016